FCC to consider rules for AI-generated political ads on TV, radio, but it can't regulate streaming
The head of the Federal Communications Commission introduced a proposal Wednesday that would require political advertisers to disclose when they use AI-generated content in broadcast TV and radio ads.
The proposal, if adopted by the commission, would add a layer of transparency that many lawmakers and artificial intelligence experts have been calling for as rapidly advancing generative AI tools produce lifelike images, videos and audio clips that threaten to mislead voters in the upcoming U.S. election.
But the FCC, the nation's top telecommunications regulator, only has authority over TV, radio and some cable providers. Any new rules would not cover the explosive growth in advertising on digital and streaming platforms.
"As artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used," FCC Chair Jessica Rosenworcel said in a statement Wednesday. "Today, I've shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue."
This is the second time this year that the commission has begun taking significant steps to combat the growing use of artificial intelligence tools in political communications. Earlier, the FCC confirmed that AI voice-cloning tools in robocalls are banned under existing law. That decision followed an incident in New Hampshire's primary election when automated calls used voice-cloning software to imitate President Joe Biden in order to dissuade voters from going to the polls.
If adopted, the proposal would ask broadcasters to verify with political advertisers whether their content was generated using AI tools — like text-to-image creators or voice-cloning software. The FCC has authority over political advertising on broadcast channels under the 2002 Bipartisan Campaign Reform Act.
But commissioners would still have to discuss several details, including whether broadcasters would have to disclose AI-generated content in an on-air message or only in the TV or radio station's political files, which are public. They also will be tasked with agreeing on a definition of AI-generated content, a challenge that has become fraught as retouching tools and other AI advancements become increasingly embedded in all kinds of creative software.
Rosenworcel hopes to have the regulations in place before the election.
Jonathan Uriarte, a spokesperson and policy adviser for Rosenworcel, said she is looking to define AI-generated content as that generated using computational technology or machine-based systems, "including, in particular, AI-generated voices that sound like human voices, and AI-generated actors that appear to be human actors." He said her draft definition will likely change through the regulatory process.
The proposal comes as political campaigns already have experimented heavily with generative AI, from building chatbots for their websites to creating videos and images using the technology.
Last year, for example, the Republican National Committee released an entirely AI-generated ad meant to show a dystopian future under another Biden administration. It employed fake but realistic photos showing boarded-up storefronts, armored military patrols in the streets and waves of immigrants creating panic.
Political campaigns and bad actors also have weaponized highly realistic images, videos and audio content to scam, mislead and disenfranchise voters. In India's elections, recent AI-generated videos misrepresenting Bollywood stars as criticizing the prime minister exemplify a trend AI experts say is cropping up in democratic elections around the world.
Rob Weissman, president of the advocacy group Public Citizen, said he was glad to see the FCC "stepping up to proactively address threats from artificial intelligence and deepfakes, including especially to election integrity."
He urged the FCC to require on-air disclosure for the public's benefit and chided another agency, the Federal Election Commission, for its delays as it also considers whether to regulate AI-generated deepfakes in political ads.
Rep. Yvette Clarke, a Democrat from New York, said it's time for Congress to act on the spread of online misinformation, which the FCC doesn't have jurisdiction over. She has introduced legislation for disclosure requirements on AI-generated content in online ads.
As generative AI has become cheaper, more accessible and easier to use, multiple bipartisan groups of lawmakers have called for legislation to regulate the technology in politics. With just a little over five months until the November elections, they still have not passed any bills.
A bipartisan bill introduced by Sen. Amy Klobuchar, a Democrat from Minnesota, and Sen. Lisa Murkowski, a Republican from Alaska, would require political ads to have a disclaimer if they are made or significantly altered using AI. It would require the Federal Election Commission to respond to violations.
Uriarte said Rosenworcel realizes the FCC's capacity to act on AI-related threats is limited but wants to do what she can ahead of the 2024 election.
"This proposal offers the maximum transparency standards that the commission can enforce under its jurisdiction," Uriarte said. "It is our hope that government agencies and lawmakers can build on this important first step in establishing a transparency standard on the use of AI in political advertising."